Supplementary Material: Random Feature Expansions for Deep Gaussian Processes

Authors

  • Kurt Cutajar
  • Edwin V. Bonilla
  • Pietro Michiardi
  • Maurizio Filippone
Abstract

A. Additional Experiments

Using the experimental set-up described in Section 4, Figure 1 shows how the competing models perform with respect to RMSE (or error rate) and MNLL when two hidden layers are incorporated. The results follow a similar progression to those reported in Figure 3 of the main paper. The DGP-ARC and DGP-RBF models both continue to perform well after this additional layer is introduced. In contrast, the results for the regularized DNN are notably inferior, and the degree of overfitting is much greater; for this reason, the MNLL obtained on the MNIST dataset is not shown in the plot, as it was vastly inferior to the values obtained by the other methods. DGP-EP also scaled poorly in this configuration, and it was not possible to obtain sensible results for the MNIST dataset with it.
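For reference, the two metrics used in these comparisons can be computed as in the sketch below. The function names are illustrative, and Gaussian predictive marginals per test point are assumed for simplicity; in practice a DGP's predictive density is usually estimated by Monte Carlo averaging over posterior samples.

```python
import numpy as np
from scipy.stats import norm

def rmse(y_true, y_pred_mean):
    """Root mean squared error of the predictive mean."""
    return np.sqrt(np.mean((y_true - y_pred_mean) ** 2))

def mnll_gaussian(y_true, y_pred_mean, y_pred_var):
    """Mean negative log-likelihood, assuming a Gaussian predictive
    marginal per test point (a simplifying assumption for this sketch)."""
    return -np.mean(norm.logpdf(y_true, loc=y_pred_mean,
                                scale=np.sqrt(y_pred_var)))

# Toy usage with synthetic predictions
y_true = np.array([0.1, -0.3, 0.7])
y_mean = np.array([0.0, -0.2, 0.9])
y_var = np.array([0.05, 0.04, 0.10])
print(rmse(y_true, y_mean), mnll_gaussian(y_true, y_mean, y_var))
```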


Similar articles

Random Feature Expansions for Deep Gaussian Processes

The composition of multiple Gaussian Processes as a Deep Gaussian Process (DGP) enables a deep probabilistic nonparametric approach to flexibly tackle complex machine learning problems with sound quantification of uncertainty. Existing inference approaches for DGP models have limited scalability and are notoriously cumbersome to construct. In this work we introduce a novel formulation of DGPs b...
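To make the construction concrete, the following is a minimal sketch of the idea, not the authors' implementation: each GP layer with an RBF kernel is approximated by random Fourier features followed by a linear map with Gaussian weights, and layers are stacked to form the DGP. The function names, the single-phase cosine features, and all hyperparameter values are illustrative assumptions; the paper itself treats the weights (and optionally the spectral frequencies) probabilistically and performs approximate Bayesian inference over them, whereas this sketch only draws a single prior sample.

```python
import numpy as np

def rbf_random_features(X, omega, b):
    """Random Fourier features approximating an RBF kernel:
    k(x, x') ~= phi(x) . phi(x')."""
    M = omega.shape[1]
    return np.sqrt(2.0 / M) * np.cos(X @ omega + b)

def dgp_rbf_forward(X, layer_dims, lengthscale=1.0, n_rff=100, rng=None):
    """One forward (prior) sample through a stack of GP layers, each with an
    RBF kernel approximated by random Fourier features and a Gaussian-weight
    linear map. No inference is performed here."""
    rng = np.random.default_rng(rng)
    F = X
    for d_out in layer_dims:
        d_in = F.shape[1]
        omega = rng.normal(scale=1.0 / lengthscale, size=(d_in, n_rff))  # spectral frequencies
        b = rng.uniform(0.0, 2.0 * np.pi, size=n_rff)                    # random phases
        W = rng.normal(size=(n_rff, d_out))                              # Gaussian weights
        F = rbf_random_features(F, omega, b) @ W
    return F

# Toy usage: a prior sample from a 2-hidden-layer DGP on random inputs
X = np.random.default_rng(0).normal(size=(5, 3))
print(dgp_rbf_forward(X, layer_dims=[4, 4, 1], rng=0).shape)  # (5, 1)
```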


Expansions for Gaussian processes and Parseval frames

We derive a precise link between series expansions of Gaussian random vectors in a Banach space and Parseval frames in their reproducing kernel Hilbert space. The results are applied to pathwise continuous Gaussian processes, and a new optimal expansion for fractional Ornstein-Uhlenbeck processes is derived.
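As an illustration of the kind of expansion involved (stated here in its standard generic form, under the assumption of a centred Gaussian process X with reproducing kernel Hilbert space H): if {f_n} is a Parseval frame of H and the coefficients are i.i.d. standard normal, the series has the same law as X,

$$
X \;\overset{d}{=}\; \sum_{n=1}^{\infty} \xi_n \, f_n,
\qquad \xi_n \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0,1),
\qquad \{f_n\}_{n\ge 1} \ \text{a Parseval frame of } \mathcal{H}.
$$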


Complete convergence of moving-average processes under negative dependence sub-Gaussian assumptions

Complete convergence is investigated for moving-average processes generated by a doubly infinite sequence of negatively dependent sub-Gaussian random variables with zero means, finite variances, and absolutely summable coefficients. As a corollary, the rate of complete convergence is obtained under suitable conditions on the coefficients.
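For context, a moving-average process built from a doubly infinite sequence {Y_i} with absolutely summable coefficients {a_i} has the standard form below; the specific negative-dependence and sub-Gaussian conditions are as stated in that work.

$$
X_n = \sum_{i=-\infty}^{\infty} a_{i+n} Y_i,
\qquad \sum_{i=-\infty}^{\infty} |a_i| < \infty .
$$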


Latent Stick-Breaking Processes.

We develop a model for stochastic processes with random marginal distributions. Our model relies on a stick-breaking construction for the marginal distribution of the process, and introduces dependence across locations by using a latent Gaussian copula model as the mechanism for selecting the atoms. The resulting latent stick-breaking process (LaSBP) induces a random partition of the index spac...
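The stick-breaking construction referred to above is, in its generic (Dirichlet process) form, the following; the LaSBP's latent Gaussian copula mechanism for selecting the atoms is not reproduced here.

$$
G = \sum_{k=1}^{\infty} w_k \, \delta_{\theta_k},
\qquad w_k = v_k \prod_{j<k} (1 - v_j),
\qquad v_k \sim \mathrm{Beta}(1, \alpha),
\qquad \theta_k \sim G_0 .
$$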


Supplementary Material: Deep Adaptive Image Clustering

This is the supplementary material for the paper entitled “Deep Adaptive Image Clustering”. The supplementary material is organized as follows. Section 1 gives the mapping function described in Figure 1. Section 2 presents the proof of Theorem 1. Section 3 details the experimental settings in our experiments. 1. The Mapping Function Utilized in Figure 1 We assume that li represents the label fe...




Publication date: 2017